
    A visualisation and simulation framework for local and remote HRI experimentation

    In this text, we present work on the design and development of a ROS-based (Robot Operating System) remote 3D visualisation, control and simulation framework. This architecture extends the usability of a system devised in previous work by this research team during the CASIR (Coordinated Attention for Social Interaction with Robots) project. The proposed solution was implemented using ROS and designed to meet the needs of two user groups: local and remote users and developers. The framework consists of: (1) a fully functional simulator integrated with the ROS environment, including a faithful representation of a robotic platform, a human model with animation capabilities and enough features for enacting human-robot interaction scenarios, and a virtual experimental setup with features similar to the real laboratory workspace; (2) a fully functional and intuitive user interface for monitoring and development; (3) a remote robotic laboratory that connects remote users to the framework via a web browser. The proposed solution was thoroughly and systematically tested under operational conditions to assess its qualities in terms of features, ease of use and performance. Finally, conclusions concerning the success and potential of this research and development effort are drawn, and foundations for future work are proposed.
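
    As a purely illustrative, hypothetical sketch (not the authors' code), the kind of glue node such a ROS-based framework relies on might relay the simulated robot's joint states onto a monitoring topic that a local visualisation tool or a remote web client (for example, through the standard rosbridge websocket bridge) can subscribe to. The topic names and the "casir" namespace below are assumptions made for the example.

    ```python
    # Hypothetical relay node: forwards simulated joint states to a monitoring topic.
    import rospy
    from sensor_msgs.msg import JointState

    def relay(msg, pub):
        # Forward the simulator's joint state unchanged to the monitoring topic.
        pub.publish(msg)

    if __name__ == "__main__":
        rospy.init_node("casir_state_relay")
        pub = rospy.Publisher("/casir/monitor/joint_states", JointState, queue_size=10)
        rospy.Subscriber("/casir/sim/joint_states", JointState, relay, callback_args=pub)
        rospy.spin()
    ```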

    Touch attention Bayesian models for robotic active haptic exploration of heterogeneous surfaces

    This work contributes to the development of active haptic exploration strategies for surfaces using robotic hands in environments with unknown structure. The architecture of the proposed approach consists of two main Bayesian models, implementing the touch attention mechanisms of the system. The model π_per perceives and discriminates different categories of materials (haptic stimuli) by integrating compliance and texture features extracted from haptic sensory data. The model π_tar actively infers the next region of the workspace that should be explored by the robotic system, integrating task information, continuously updated saliency and uncertainty maps extracted from the perceived haptic stimulus map, and inhibition-of-return mechanisms. The experimental results demonstrate that the Bayesian model π_per can discriminate 10 different classes of materials with an average recognition rate higher than 90%. The generalization capability of the proposed models was demonstrated experimentally. In simulation, the ATLAS robot was able to follow a discontinuity between two regions made of different materials with a divergence smaller than 1 cm (30 trials). The tests were performed in scenarios with 3 different configurations of the discontinuity. The Bayesian models demonstrated the capability to manage uncertainty about the structure of the surfaces and sensory noise in order to make correct motor decisions from haptic percepts.
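
    A minimal toy sketch of the two decisions described above, under assumptions not taken from the paper: Gaussian class-conditional likelihoods over the compliance/texture features stand in for π_per, and a greedy pick of the next workspace cell from weighted saliency plus uncertainty with inhibition of return stands in for π_tar. Function names, weights and array shapes are illustrative only.

    ```python
    import numpy as np

    def material_posterior(features, means, covs, prior):
        """Posterior over material classes given one haptic feature vector
        (assumed Gaussian class-conditional likelihoods)."""
        log_post = np.log(np.asarray(prior, dtype=float))
        for k in range(len(prior)):
            diff = features - means[k]
            inv = np.linalg.inv(covs[k])
            log_post[k] += -0.5 * diff @ inv @ diff - 0.5 * np.log(np.linalg.det(covs[k]))
        post = np.exp(log_post - log_post.max())  # normalise in a numerically stable way
        return post / post.sum()

    def next_region(saliency, uncertainty, inhibited, w=(1.0, 1.0)):
        """Pick the workspace cell maximising weighted saliency + uncertainty,
        excluding recently visited (inhibited) cells."""
        score = w[0] * saliency + w[1] * uncertainty
        score[inhibited] = -np.inf  # inhibition of return
        return np.unravel_index(np.argmax(score), score.shape)
    ```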